▒▒▄ ▒▒▒▒▒▒▄ ▒▒▒▒▒▒▄ ▒▒▒▒▒▒▄ ▒▒▄ ▒▒▒▄▄ ▒▒▄ ▒▒▒▒▒▒▄ ▒▒▒▒▒▒▄
▒▒█ ▒▒█▀▀▀▀ ▒▒█▀▀▀▀ ▒▒█▀▒▒█ ▒▒█ ▒▒█▒▒█▒▒█ ▒▒█▀▀▀▀ ▀▒▒█▀▀
▒▒█ ▒▒▒▒▒▄ ▒▒█▒▒▒▄ ▒▒▒▒▒▒█ ▒▒█ ▒▒█ ▀▒▒▒█ ▒▒▒▒▒▄ ▒▒█
▒▒█ ▒▒█▀▀▀ ▒▒█ ▒▒█ ▒▒█ ▒▒█ ▒▒█ ▒▒█ ▒▒█ ▒▒█▀▀▀ ▒▒█
▒▒▒▒▒▒▄ ▒▒▒▒▒▒▄ ▒▒▒▒▒▒█ ▒▒█ ▒▒█ ▒▒▒▒▒▒▄ ▒▒█ ▒▒█ ▒▒▒▒▒▒▄ ▒▒█
▀▀▀▀▀▀ ▀▀▀▀▀▀ ▀▀▀▀▀▀ ▀▀ ▀▀ ▀▀▀▀▀▀ ▀▀ ▀▀ ▀▀▀▀▀▀ ▀▀
▒▒▒▄▄ ▒▒▄ ▒▒▒▒▒▒▄ ▒▒▄ ▒▒▄ ▒▒▒▒▒▒▄
▒▒█▒▒█▒▒█ ▒▒█▀▀▀▀ ▒▒█ ▒▒█ ▒▒█▀▀▀▀
▒▒█ ▀▒▒▒█ ▒▒▒▒▒▄ ▒▒█ ▒▒█ ▒▒▒▒▒▒▄
▒▒█ ▒▒█ ▒▒█▀▀▀ ▒▒█▒▒▄▒▒█ ▀▀▀▒▒█
▒▒█ ▒▒█ ▒▒▒▒▒▒▄ ▀▒▒▒▒█▀▀ ▒▒▒▒▒▒█
▀▀ ▀▀ ▀▀▀▀▀▀ ▀▀▀▀ ▀▀▀▀▀▀
Legal Net Newsletter
Volume 1, Issue 8 -- June 3, 1993
Legal Net Newsletter is dedicated to providing information
on the legal issues of computing and networking in the 1990's
and into the future.
The information contained in this newsletter is not to be
misconstrued as a bona fide legal document, nor is it to be taken
as an advocacy forum for topics discussed and presented herein.
The information contained within this newsletter has been
collected from several governmental institutions, computer
professionals and third party sources. Opinion and ideological
excerpts have been collected from many sources with prior approval.
"Legal Net News", "Legal Net Newsletter"
and the Legal Net News logo are
Copyright (c) 1993 Paul Ferguson -- All rights reserved.
This newsletter may be freely copied and distributed in its entirety.
Singular items contained within this newsletter may also be
freely copied and distributed, with the exception of
individual copyrighted items which appear with
the prior approval of the originating author.
Legal Net News can be found at the following locations:
Publicly Accessible BBS's
-------------------------
The SENTRY Net BBS Arlington Software Exchange
Centreville, Virginia USA Arlington, Virginia USA
+1-703-815-3244 +1-703-532-7143
To 9,600 bps To 9,600 bps
The Internet
------------
tstc.edu (161.109.128.2) Directory: /pub/legal-net-news
Login as ANONYMOUS and use your net ID (for example: fergp@sytex.com)
as the password. Or send e-mail to
postmaster@tstc.edu
E-mail submissions, comments and editorials to: fergp@sytex.com
--
In this issue -
o CPSR files suit against the NSA for "Clipper" info
o EFF comments to the NIST
o The RISKS of teaching about computers and the law
o Case History: "Some 'Property' Problems in a Computer Crime
Prosecution," by Mike Godwin
o CPSR statement to the NIST
--
Date: Fri, 28 May 1993 14:30:44 EST
From: Dave Banisar <uunet!washofc.cpsr.org!banisar>
Organization: CPSR Civil Liberties and Computing Project
Subject: CPSR Seeks Clipper Docs
CPSR Seeks Clipper Docs
PRESS RELEASE May 28, 1993
CPSR Seeks Clipper Documents - Brings Suit Against NSA and National
Security Council
Washington, DC -- Computer Professionals for Social
Responsibility filed suit today in federal district court seeking
information about the government's controversial new cryptography
proposal.
The "Clipper" proposal, announced by the White House at an
April 16 press conference, is based on a technology developed by the
National Security Agency that would allow the government to intercept
computer encoded information. Law enforcement agencies say that
this capability is necessary to protect court-ordered wire
surveillance.
But industry groups and civil liberties organizations have raised
questions about the proposal. They cite the risk of abuse, the
potential loss in security and privacy, costs to US firms and
consumers, and the difficulties enforcing the policy.
Marc Rotenberg, CPSR Washington office director, said "The
Clipper plan was developed behind a veil of secrecy. It is not enough
for the White House to hold a few press conferences. We need to know
why the standard was developed, what alternatives were considered, and
what the impact will be on privacy."
"As the proposal currently stands, Clipper looks a lot like
'desktop surveillance,'" added Rotenberg.
David Sobel, CPSR Legal Counsel, said "CPSR is continuing its
oversight of federal cryptography policy. These decisions are too
important to be made in secret, without public review by all interested
parties."
In previous FOIA suits, CPSR obtained records from the General
Services Administration questioning the FBI's digital telephony plan, a
legislative proposal to require that communications companies design
wiretap capability into their networks. More recently, CPSR obtained
records through the
FOIA revealing the involvement of the National Security Agency in the
development of unclassified technical standards in violation of
federal law.
CPSR is a national membership organization, based in Palo Alto,
CA. Membership is open to the public. For more information about
CPSR, contact CPSR, P.O. Box 717, Palo Alto, CA 94302, 415/322-3778
(tel), 415/322-3798 (fax), cpsr@cpsr.org
--
Excerpt extracted from -
EFFector Online Volume 5 No. 9 5/28/1993 editors@eff.org
A Publication of the Electronic Frontier Foundation ISSN 1062-9424
EFF Comments to the NIST (the National Institute of Standards and
Technology):
May 27, 1993
Before the
COMPUTER SYSTEM SECURITY AND PRIVACY ADVISORY BOARD
Technology Building, Room B-154
National Institute of Standards and Technology
Gaithersburg, MD 20899
COMMENTS OF THE ELECTRONIC FRONTIER FOUNDATION
Regarding
Key Escrow Chip Cryptographic Technology and Government
Cryptographic Policies and Regulations
The Electronic Frontier Foundation (EFF) commends the Computer
System Security and Privacy Advisory Board for offering the public
the opportunity to comment on developments in cryptography and
communications privacy policy. Recent Administration proposals,
including use of the Clipper Chip and establishment of a government-
controlled key escrow system, raise questions that cut to the core of
privacy protection in the age of digital communication technology.
The questions noted by the Advisory Board in its Notice of Open
Meeting (58 FR 28855) reflect a broad range of concerns, from civil
liberties to global competitiveness. The Digital Privacy and Security
Working Group -- a cooperative effort of civil liberties organizations
and corporate users and developers of communication technology
which is chaired by the EFF -- has also submitted over one hundred
questions to the Administration. (These questions are being
submitted to the Advisory Board under separate cover on behalf of
the Working Group.) That there are so many questions demonstrates
the need for a comprehensive review of cryptography and privacy
policy.
We are encouraged that the Administration has expressed a
willingness to undertake such a review. However, it has become clear
that plans for rapid introduction of the Clipper Chip could
unacceptably distort this important policy review. The
Administration has made no secret of the fact that it hopes to use
government purchasing power to promote Clipper as a de facto
standard for encryption. With Clipper on the market, the policy
process will be biased toward a long-term solution such as Clipper
with key escrow. Moreover, the rush to introduce Clipper is already
forcing a hasty policy review which may fail to provide adequate
public dialogue on the fundamental privacy questions which must be
resolved to reach a satisfactory cryptography policy. Based on the
depth and complexity of questions raised by this review, EFF
believes that no solution, with Clipper Chip or otherwise, should be
adopted by the government until the comprehensive cryptography
review initiated by the Administration is complete.
EFF is a nonprofit, public interest organization whose public policy
mission is to ensure that the new electronic highways emerging from
the convergence of telephone, cable, broadcast, and other
communications technologies enhance free speech and privacy rights,
and are open and accessible to all segments of society.
In these comments, we will elaborate on questions 1, 2, and 3 listed
in the Advisory Board's Notice. We offer these comments primarily to
raise additional questions that must be answered during the course
of the Administration's policy review.
A. WILL PARTICULAR ENCRYPTION TECHNOLOGIES BE MANDATED OR
PROSCRIBED?: A THRESHOLD QUESTION
Unraveling the current encryption policy tangle must begin with one
threshold question: will there come a day when the federal
government controls the domestic use of encryption through
mandated key escrow schemes or outright prohibitions against the
use of particular encryption technologies? Is Clipper the first step in
this direction? A mandatory encryption regime raises profound
constitutional questions, some of which we will discuss below. So far,
the Administration has not declared that use of Clipper will be
mandatory, but several factors point in that direction:
1. Secrecy of the algorithm justified by need to ensure key escrow
compliance:
Many parties have already questioned the need for a secret
algorithm, especially given the existence of robust, public-domain
encryption techniques. The most common explanation given for use
of a secret algorithm is the need to prevent users from by-passing
the key escrow system proposed along with the Clipper Chip. If the
system is truly voluntary, then why go to such lengths to ensure
compliance with the escrow procedure?
2. How does a voluntary system solve law enforcement's problems?
The major stated rationale for government intervention in the
domestic encryption arena is to ensure that law enforcement has
access to criminal communications, even if they are encrypted. Yet, a
voluntary scheme seems inadequate to meet this goal. Criminals who
seek to avoid interception and decryption of their communications
would simply use another system, free from escrow provisions.
Unless a government-proposed encryption scheme is mandatory, it
would fail to achieve its primary law enforcement purpose. In a
voluntary regime, only the law-abiding would use the escrow
system.
B. POLICY CONCERNS ABOUT GOVERNMENT-RUN KEY ESCROW SYSTEM
Even if government-proposed encryption standards remain
voluntary, the use of key escrow systems still raises serious
concerns:
1. Is it wise to rely on government agencies, or government-selected
private institutions to protect the communications privacy of all who
would someday use a system such as Clipper?
2. Will the public ever trust a secret algorithm with an escrow
system enough to make such a standard widely used?
C. CONSTITUTIONAL IMPLICATIONS OF GOVERNMENT CONTROLS ON
USE OF ENCRYPTION
Beyond the present voluntary system is the possibility that specific
government controls on domestic encryption could be enacted. Any
attempt to mandate a particular cryptographic standard for private
communications, a requirement that an escrow system be used, or a
prohibition against the use of specific encryption algorithms, would
raise fundamental constitutional questions. In order to appreciate the
importance of the concerns raised, we must recognize that we are
entering an era in which most of society will rely on encryption to
protect the privacy of their electronic communications. The following
questions arise:
1. Does a key escrow system force a mass waiver of all users' Fifth
Amendment right against self-incrimination?
The Fifth Amendment protects individuals facing criminal charges
from having to reveal information which might incriminate them at
trial. So far, no court has determined whether or not the Fifth
Amendment allows a defendant to refuse to disclose his or her
cryptographic key. As society and technology have changed, courts
and legislatures have gradually adapted fundamental constitutional
rights to new circumstances. The age of digital communications
brings many such challenges to be resolved. Such decisions require
careful, deliberate action. But the existence of a key escrow system
would have the effect of waiving this right for every person who
used the system in a single step. We believe that this question
certainly deserves more discussion.
2. Does a mandatory key escrow system violate the Fourth
Amendment prohibition against "unreasonable search and seizure"?
In the era where people work for "virtual corporations" and conduct
personal and political lives in cyberspace, the distinction between
communication of information and storage of information is
increasingly vague. The organization in which one works or lives may
constitute a single virtual space, but be physically dispersed. So, the
papers and files of the organization or individual may be moved
within the organization by means of telecommunications technology.
Until now, the law of search and seizure has made a sharp distinction
between, on the one hand, seizures of papers and other items in a
person's physical possession, and on the other hand, wiretapping of
communications. Seizure of papers or personal effects must be
conducted with the owner's knowledge, upon presentation of a
search warrant. Only in the exceptional case of wiretapping, may a
person's privacy be invaded by law enforcement without
simultaneously informing the target. Instantaneous access to
encryption keys, without prior notice to the communicating parties,
may well constitute a secret search, if the target is a virtual
organization or an individual whose "papers" are physically
dispersed. Under the Fourth Amendment, secret searches are
unconstitutional.
3. Does prohibition against use of certain cryptographic techniques
infringe individuals' right to free speech?
Any government restriction on or control of speech is to be regarded
with the utmost scrutiny. Prohibiting the use of a particular form of
cryptography for the express purpose of making communication
intelligible to law enforcement is akin to prohibiting anyone from
speaking a language not understood by law enforcement. Some may
argue that cryptography limitations are controls on the "time, place
and manner" of speech, and therefore subject to a more lenient legal
standard. However, time, place and manner restrictions that have
been upheld by courts include laws which keep the volume of
speakers from interfering with surrounding activities, or those which
confine demonstrators to certain physical areas.
No court has ever upheld an outright ban on the use of a particular
language. Moreover, even a time, place and manner restriction must
be shown to be the "least restrictive means" of accomplishing the
government's goal. It is precisely this question -- the availability of
alternatives which could solve law enforcement's actual problems --
that must be explored before a solution such as Clipper is promoted.
D. PUBLIC PROCESS FOR CRYPTOGRAPHY POLICY
As this Advisory Board is well aware, the Computer Security Act of
1987 clearly established that neither military nor law enforcement
agencies are the proper protectors of personal privacy. When
considering the law, Congress asked, "whether it is proper for a
super-secret agency [the NSA] that operates without public scrutiny
to involve itself in domestic activities...?" The answer was a clear
"no." Recent Administration announcements regarding the Clipper
Chip suggest that the principle established in the 1987 Act has been
circumvented. For example, this Advisory Board was not consulted
until after public outcry over the Clipper announcements. Not
only did the initial failure to consult eschew the guidance of the
1987 Act, but it also ignored the fact that this Advisory Board was
already in the process of conducting a cryptography review.
As important as the principle of civilian control was in 1987, it is
even more critical today. The more individuals around the country
come to depend on secure communications to protect their privacy,
the more important it is to conduct privacy and security policy
dialogues in public, civilian forums.
CONCLUSION
The EFF thanks the Advisory Board for the opportunity to comment
on these critical public policy issues. In light of the wide range of
difficult issues raised in this inquiry, we encourage the Advisory
Board to call on the Administration to delay the introduction of
Clipper-based products until a thorough, public dialogue on
encryption and privacy policy has been completed.
Respectfully Submitted,
Electronic Frontier Foundation
Jerry Berman
Executive Director
jberman@eff.org
Daniel J. Weitzner
Senior Staff Counsel
djw@eff.org
--
Excerpt extracted from RISKS Digest (14.65) -
Date: Fri, 21 May 93 16:13:46 EDT
From: junger@samsara.law.cwru.edu (Peter D. Junger)
Subject: The risks of teaching about computers and the law
A fortnight ago, in order to postpone the necessity of grading
final exams, I started writing a simple-minded encryption program, which
uses a "one-time pad" as a key, for use this Fall in my class on
Computers and the Law. The program is intended to demonstrate certain
things that lawyers who are going to deal with the problems generated by
computers should know: things like the nature of an algorithm and the
fact that any text (that is encoded in binary digits) of length n
contains (if one just has the key) all other texts of length n.
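[Editor's note: the one-time-pad property described above can be sketched in a few lines of Python. This is an illustration only, not Junger's 174-byte 8086 program; the message texts are invented for the example. With XOR as the pad operation, any ciphertext of length n can be "decrypted" to any chosen text of length n simply by deriving the right key.]

```python
import os

def xor_pad(data: bytes, key: bytes) -> bytes:
    """One-time pad via XOR; the same call both encrypts and decrypts."""
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

# Normal use: a random pad as long as the message.
secret = b"ATTACK AT DAWN"
key = os.urandom(len(secret))
ciphertext = xor_pad(secret, key)
assert xor_pad(ciphertext, key) == secret

# Junger's classroom point: for ANY other text of the same length,
# a key exists that turns the same ciphertext into that text.
other = b"RETREAT AT TEN"
fake_key = xor_pad(ciphertext, other)  # key = ciphertext XOR desired text
assert xor_pad(ciphertext, fake_key) == other
```

Because every equal-length plaintext is reachable by some key, the ciphertext alone conveys nothing about which text was sent.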
Although in that course we shall mainly be concerned with
copyright and patent issues relating to computer programs, we should
also spend some time on security issues and on government regulation of
computer programs. And that, of course, includes the regulation of the
export of computer programs, including cryptographic programs and
technical information relating to such programs. I shall also have to
discuss cryptographic programs when dealing with issues of computer
security, since it would profit lawyers to be aware of the fact that
cryptography can do far more than the law can to keep one's confidences
confidential. The latter point is, of course, of particular importance
to members of a profession who have a legal and moral duty to keep their
clients' confidences confidential from everyone, but especially from the
agents of the state.
As I was writing this program I realized that it itself, and any
`technical data' relating to it, might be subject to federal export
licensing regulations, since I intended to give copies of it to, and
discuss it with, my students and make it available to anyone who wants
it, even foreigners. Even if I do not put it on an anonymous FTP
server, as I originally planned, there is no way that I can guarantee
that all the students who enroll in my class will be citizens or
permanent residents of the United States.
After a little quick research I have determined that my program
may be--and, in fact, probably is--subject to such licensing, though
whether by the Department of Commerce or that of State is a matter that
it will take some sixty days for the bureaucrats to determine. The
trouble is that the program, which should run on any PC clone running
MSDOS 3 or higher, and which now consists in its entirety of 174 bytes
of 8086 machine code, which I am pretty sure I can get down to 170 bytes
or less, is squarely covered by the definitions of Category XIII of the
U.S. Munitions List (as is my old Captain Midnight Decoder, which I got
during the War for a boxtop--or was it an Ovaltine label?--and change).
The relevant subdivision of Category XIII of the Munitions List
is (b), which provides in relevant part:
(b) Information Security Systems and equipment, cryptographic
devices, software, and components specifically designed or
modified therefor, including:
(1) Cryptographic (including key management) systems,
equipment, assemblies, modules, integrated circuits,
components or software with the capability of maintaining
secrecy or confidentiality of information or information
systems, except cryptographic equipment and software as
follows:
.... [none of the exceptions appear to be applicable to my
program]
There is no exception for encryption software that is so simple minded
that a law teacher, whose only degrees are in English and law, can hack
it out in about six hours, most of which time was spent chasing bugs
that were the result of typos. I estimate that the average computer
literate 12-year old could have written the program in about 20 minutes.
In the course of my researches, which so far have consisted
of speaking to a very pleasant person at the Department of Commerce's
Bureau of Export Administration, to a not very nice major and a slightly
nicer person at the Department of State's Bureau of Politico-Military
Affairs, Office of Defense Trade Controls, and to a not un-nice person,
whose name I was not allowed to know, who supposedly was at NSA, and
wading an inch or so into a seven inch stack of Commerce Department
regulations and a few more inches of statutes, I have concluded that if
I `export' my little program without first getting a license I may be
subject to a fine of not more than $1,000,000, or imprisonment for not
more than ten years, or both.
This isn't so bad, since in the case of the actual program it is
pretty clear that `exporting' means exporting, so, since I don't intend
to export the program, the only problem is that posting it on an FTP
server on the internet gets into a `grey' area (according to the
unknowable at NSA). Of course, if the program is considered to be my
expression--which it must be if it is protected by the copyright
laws--it is probably a violation of the First Amendment to require me to
get a license before I can export it. But since I don't intend to
export it--and the unknowable, on whom I dare not rely, did keep saying
that it was a matter of my intention--I can treat that issue as an
academic problem. (By the way, it is my position that the actual
program--the machine code--not being in any sense expression--cannot
Constitutionally be protected by copyright law; this is a position that
the lower courts have--at least _sub silentio_--uniformly rejected, but
it is a good bet that the Supreme Court will agree with me when it
finally gets around to considering this issue!)
The real trouble is that Category XIII contains as its final
subdivision paragraph (k), which covers
(k) Technical data . . . related to the defense articles listed
in this category.
And that, of course, means that I cannot lawfully export technical data
about my program without first obtaining a license.
But the regulations relating to technical data that is included on the
Munitions List say, in effect, that the `export' of technical data includes
talking about the defense article to which the data relates--which in my case
is my piddling little program--in the presence of someone who is neither a
citizen of the United States nor admitted to permanent residence in the United
States. So, if any foreign students sign up for my course I will be required
to get a license--which I am not sure I can get at all, and certainly will not
be able to get in time to teach my course--before describing the program to my
class, explaining how to use it, and giving them the source code--which, by
the way, I contend _does_ contain expression--to load in with the debug
program.
I admit that I am not greatly concerned about the potential criminal
penalties that might be imposed if I do discuss the program with my students
without a license, and not only because I don't have a million dollars
and--for all I know--may not have ten years. I cannot imagine anyone--except
perhaps that major--who would be stupid enough to try to punish me for
discussing my trivial program with my students.
But how can I teach this particular bit of computer law if the very
act of teaching amounts--at least in theory--to a criminal violation of the
very law that I am teaching? That this is not a logical paradox is an
illustration of the fact that the law is not logic; but I still feel that I am
trapped in an impossible situation.
It is hard for me as a law teacher to believe that this regulatory
scheme that requires me to get a prior license each time that I speak about,
or publish the details of, my trivial program (or, in the alternative, to make
sure that no foreigners get to hear or read what I have to say about it) can
withstand a constitutional challenge on First Amendment grounds.
The "secret" of how to keep a secret in 170 bytes or less is not
something that poses any conceivable threat to the security of the United
States, especially not when the underlying algorithm is well known to most who
are, and many who aren't, knowledgeable about computers--or, for that matter,
about logic. And thus the government can't constitutionally punish me for
revealing this "secret" of mine or talking and writing about how it works.
And even if the government could constitutionally punish me after the fact,
that does not mean that they can impose a prior restraint on my speaking or
writing about the "secret". Prior restraints on speech or publication--and
especially licensing schemes--are especially vulnerable to constitutional
attack, since the First Amendment provisions relating to the freedom of speech
and of the press were adopted in large part to prevent the federal government
from adopting the type of censorship and licensing that had prevailed in
England under the Tudor and Stuart monarchies.
And yet I am so intimidated and disheartened by this
unconstitutional scheme that I dare not explain in a submission to
Risks, which undoubtedly has foreign subscribers, how my silly little
program works. And even if I were willing to take that risk, I could
not in good conscience impose it on our moderator.
And if I have problems now, just think how ridiculous the
situation will be if the government tries to outlaw all encryption
programs and devices other than the Clipper Chip.
[For those of you who understand how my program works and who
take the effort to write your own encryption program based on that
understanding, I have a special offer. If you will just send me an
E-mail message certifying that you are a United States Citizen, I will
send you (at any address on the internet that is within the United
States), a UUENCODEd key that when applied by your program to this
particular submission to Risks--after all headers have been stripped
off--will produce a working copy of my program, which is a COM file that
runs under MSDOS. (Be sure that your copy of this submission uses the
Carriage Return / Line Feed combination as the End of Line indicator.)]
Peter D. Junger
Case Western Reserve University Law School, Cleveland, OH
Internet: JUNGER@SAMSARA.LAW.CWRU.Edu -- Bitnet: JUNGER@CWRU
--
From: mnemonic@eff.org (Mike Godwin)
Subject: Cardozo Law Forum article on the Craig Neidorf Computer-Crime
Message-ID: <1992Dec1.155823.27405@eff.org>
Date: Tue, 1 Dec 1992 15:58:23 GMT
Readers of misc.legal and comp.org.eff.talk may be interested in
the following article, which addresses the intersection of
intellectual-property law and criminal law in a computer-crime case.
The article first appeared in September in the Cardozo Law Forum at
Cardozo Law School in New York City.
----------
Some "Property" Problems in a Computer Crime Prosecution
By Mike Godwin
The spread and pervasiveness of computer technology create the
potential both for new kinds of crimes and for new variations of
traditional crimes. Law enforcement, the judiciary, and the legislature
can respond to these potentials in two ways: by seeking new laws to
address new problems, or by attempting to apply old laws (and traditional
notions of crime) in new and unforeseen situations. This article concerns
what hazards may face prosecutors and judges when law enforcement chooses
the latter tactic. In particular, it shows what can happen when
prosecutors uncritically apply intellectual property notions in
prosecuting a defendant under laws passed to protect tangible property.
The government stumbles in a "hacker" case.
In the recent case of U.S. v. Riggs, the Chicago U.S. Attorney's
office prosecuted two young men, Robert Riggs and Craig Neidorf, on counts
of wire fraud (18 U.S.C. 1343), interstate transportation of stolen
property (18 U.S.C. 2314) and computer fraud (18 U.S.C. 1030). Of these
statutes, only the last was passed specifically to address the problems of
unauthorized computer intrusion; the other two are "general purpose"
federal criminal statutes that are used by the government in a wide range
of criminal prosecutions. The wire fraud statute includes as an element
the taking (by fraudulent means) of "money or property," while the
interstate-transportation-of-stolen-property (ITSP) statute requires,
naturally enough, the element of "goods, wares, merchandise, securities or
money, of the value of $5,000 or more." (I do not address here the extent
to which the notions of "property" differ between these two federal
statutes. It is certain that they do differ to some extent, and the
interests protected by the wire-fraud statute were expanded in the 1980s
by Congress to include "the intangible right to honest services." 18
U.S.C. 1346. Even so, the prosecution in the Riggs case relies not on
1346, but on intellectual-property notions, which are the focus of this
article.) The 18 U.S.C. 1030 counts against Neidorf were dropped in the
government's June 1990 superseding indictment, the indictment actually
used at Neidorf's trial in July 1990.
The Riggs case is based on the following facts: Robert Riggs, a
computer "hacker" in his early 20s, discovered that he could easily gain
access to an account on a computer belonging to Bell South, one of the
Regional Bell Operating Companies (RBOCs). The account was highly
insecure--access to it did not require a password (a standard, if not
always effective, security precaution). While exploring this account,
Riggs discovered a word-processing document detailing procedures and
definitions of terms relating to the Emergency 911 system ("E911 system").
Like many hackers, Riggs had a deep curiosity about the workings of this
country's telephone system. (This curiosity among young hackers is a
social phenomenon that has been documented for more than 20 years. See,
e.g., Rosenbaum, "Secrets of the Little Blue Box," Esquire, October 1971;
and Barlow, "Crime and Puzzlement: In Advance of the Law on the Electronic
Frontier," Whole Earth Review, September 1990.)
Riggs knew that his discovery would be of interest to Craig Neidorf,
a Missouri college student who, while not a hacker himself, was an amateur
journalist whose electronically distributed publication, Phrack, was
devoted to articles of interest to computer hackers. Riggs sent a copy of
the E911 document to Neidorf over the telephone line--using computer and
modem--and Neidorf edited the copy to conceal its origin. Among other
things, Neidorf removed the statements that the information contained in
the document was proprietary and not for distribution. Neidorf then sent
the edited copy back to Riggs for the latter's review; following Riggs's
approval of the edited copy, Neidorf published the E911 document in the
February 24, 1989, issue of Phrack. Some months following publication of
the document in Phrack, both Riggs and Neidorf were caught and questioned
by the Secret Service, and all systems that might contain the E911
document were seized pursuant to evidentiary search warrants.
Riggs and Neidorf were indicted on the counts discussed supra; Riggs,
whose unauthorized access to the BellSouth computer was difficult to
dispute, later pled guilty to wire fraud for that conduct. Neidorf
pled not guilty on all counts, arguing, inter alia, that his conduct was
protected by the First Amendment, and that he had not deprived Bell South
of property as that notion is defined for the purposes of the wire fraud
and ITSP statutes.
The two defenses are closely related. Under the First Amendment, the
presumption is that information is free, and that it can readily be
published and republished. For this reason, information gives rise to a
property interest only if it passes certain legal tests. Law enforcement
cannot simply assume that whenever information has been copied from a
private computer system a theft has taken place.
In Neidorf's case, as it turns out, this is essentially what the
Secret Service and the U.S. Attorney's office did assume. The assumption
came back to haunt the government when it was revealed during trial that
the information contained within the E911 document did not meet any of the
relevant legal tests to be established as a property interest.
How information becomes stealable property
In order for information to be stolen property, it must first be
property. There are only a few ways that information can qualify as a
property interest, and two of these--patent law and copyright law--are
creatures of federal statute, pursuant to an express Constitutional grant
of legislative authority. (U.S. Constitution, Article I, Sec. 8, clause
8.) Patent protections were clearly inapplicable in the Neidorf case; the
E911 document, a list of definitions and procedures, did not constitute an
invention or otherwise patentable process or method. Copyright law might
have looked more promising to Neidorf's prosecutors, since it is well
established that copyrights qualify as property interests in some contexts
(e.g., the law of inheritance).
Unfortunately for the government, the Supreme Court has explicitly
stated that copyrighted material is not property for the purposes of the
ITSP statute. In Dowling v. United States, 473 U.S. 207 (1985), the Court
held that interests in copyright are outside the scope of the ITSP
statute. (Dowling involved a prosecution for interstate shipments of
pirated Elvis Presley recordings.) In reaching its decision, the Court
held, inter alia, that 18 U.S.C. § 2314 contemplates "a physical identity
between the items unlawfully obtained and those eventually transported,
and hence some prior physical taking of the subject goods." Unauthorized
copies of copyrighted material do not meet this "physical identity"
requirement.
The Court also reasoned that intellectual property is different in
character from property protected by generic theft statutes: "The
copyright owner, however, holds no ordinary chattel. A copyright, like
other intellectual property, comprises a series of carefully defined and
carefully delimited interests to which the law affords correspondingly
exact protections." The Court went on to note that a special term of art,
"infringement," is used in reference to violations of copyright
interests--thus undercutting any easy equation between unauthorized copying
and "stealing" or "theft."
It is clear, then, that in order for the government to prosecute the
unauthorized copying of computerized information as a theft, it must rely
on other theories of information-as-property. Trade secret law is one
well-established legal theory of this sort. Another is the
breach-of-confidence theory articulated recently by the Supreme Court in
Carpenter v. United States, 108 S.Ct. 316 (1987). I will discuss each
theory in turn below.
Trade Secrets
Trade secrets are generally creatures of state law, and most
jurisdictions have laws that criminalize the violations of a trade-secret
holder's rights in the secret. There is no general federal definition of
what a trade secret is, but there have been federal cases in which
trade-secret information has been used to establish the property element
of a federal property crime. See, e.g., United States v. Bottone, 365 F.2d
389 (2d Cir.), cert. denied, 385 U.S. 974 (1966), affirming ITSP
convictions in a case involving a conspiracy to steal drug-manufacturing
bacterial cultures and related documents from a pharmaceutical company and
sell them in foreign markets. (In Bottone, a pre-Dowling appellate court
expressed a willingness to interpret 18 U.S.C. § 2314 as encompassing the
interstate transportation of copies of documents detailing the
drug-manufacturing process, i.e., it did not require the "physical
identity" element discussed supra. Recognizing possible problems with this
approach, however, the appellate court reasoned in the alternative that
the bacterial cultures themselves provided a sufficient nexus of a
tangible property interest to justify application of the ITSP statute;
this alternative analysis may render Bottone consistent with Dowling. It
should be noted that the post-Dowling judge in Riggs expressed, in his
denial of a motion to dismiss, 739 F.Supp. 414 (N.D. Ill. 1990), a similar
willingness not to require actual physical identity as a predicate for
ITSP. An appellate court later criticized this decision. U.S. v. Brown,
925 F.2d 1301 (1991).)
The problem in using a trade secret to establish the property element
of a theft crime is that, unlike traditional property, information has to
leap several hurdles in order to be established as a trade secret.
Trade secret definitions vary somewhat from state to state, but the
varying definitions typically have most elements in common. One good
definition of "trade secret" is outlined by the Supreme Court in Kewanee
Oil Co. v. Bicron Corp., 416 U.S. 470 (1974): "a trade secret may consist
of any formula, pattern, device or compilation of information which is
used in one's business, and which gives one an opportunity to obtain an
advantage over competitors who do not know or use it. It may be a formula
for a chemical compound, a process of manufacturing, treating or
preserving materials, a pattern for a machine or other device, or a list
of customers." The Court went further and listed the particular
attributes of a trade secret:
* The information must, in fact, be secret--"not of public knowledge
or of general knowledge in the trade or business."
* A trade secret remains a secret if it is revealed in confidence to
someone who is under a contractual or fiduciary obligation, express or
implied, not to reveal it.
* A trade secret is protected against those who acquire it via
unauthorized disclosure, violation of a contractual duty of confidentiality,
or through "improper means." ("Improper means" includes such things as
theft, bribery, burglary, or trespass. The Restatement of Torts at 757
defines such means as follows: "In general they are means which fall below
the generally accepted standards of commercial morality and reasonable
conduct.")
* A court will allow a trade secret to be used by someone who
discovered or developed the trade secret independently (that is, without
taking it in some way from the holder), or if the holder does not take
adequate precautions to protect the secret.
* An employee or contractor who, while working for a company,
develops or discovers a trade secret, generally creates trade secret
rights in the company.
The holder of a trade secret may take a number of steps to meet its
obligation to keep the trade secret a secret. These may include:
a) Labelling documents containing the trade secret "proprietary" or
"confidential" or "trade secret" or "not for distribution to the public;"
b) Requiring employees and contractors to sign agreements not to
disclose whatever trade secrets they come in contact with;
c) Destroying or rendering illegible discarded documents containing
part or all of the secret; and
d) Restricting access to areas in the company where a nonemployee, or
an employee without a clear obligation to keep the information secret,
might encounter the secret. (See Dan Greenwood's Information Protection
Advisor, April 1992, page 5.)
Breach-of-confidence
Even if information is not protected under the federal patent and
copyright schemes, or under state-law trade-secret provisions, it is
possible, according to the Supreme Court in Carpenter, for such
information to give rise to a property interest when its unauthorized
disclosure occurs via the breach of confidential or fiduciary
relationship. In Carpenter, R. Foster Winans, a Wall Street Journal
reporter who contributed to the Journal's "Heard on the Street" column,
conspired with Carpenter and others to reveal the contents of the column
before it was printed in the Journal, thus allowing the conspirators to
buy and sell stock with the foreknowledge that stock prices would be
affected by publication of the column. Winans and others were convicted
of wire fraud; they appealed the wire-fraud convictions on the grounds
that they had not deprived the Journal of any money or property.
It should be noted that this is not an "insider trading" case, since
Winans was no corporate insider, nor was it alleged that he had received
illegal insider tips. The "Heard on the Street" column published
information about companies and stocks that would be available to anyone
who did the requisite research into publicly available materials. Since
the information reported in the columns did not itself belong to the
Journal, and since the Journal planned to publish the information for a
general readership, traditional trade secret notions did not apply. Where
was the property interest necessary for a wire-fraud conviction?
The Supreme Court reasoned that although the facts being reported in
the column were not exclusive to the Journal, the Journal's
right--presumably based in contract--to Winans' keeping the information
confidential gave rise to a property interest adequate to support a
wire-fraud conviction. Once the Court reached this conclusion, upholding
the convictions of the other defendants followed: even if one does not
have a direct fiduciary duty to protect a trade secret or confidential
information, one can become civilly or criminally liable if one conspires
with, solicits, or aids and abets a fiduciary to disclose such information
in violation of that person's duty. The Court's decision in Carpenter has
received significant criticism in the academic community for its expansion
of the contours of "intangible property," but it remains good law today.
How the theories didn't fit
With these two legal approaches--trade secrets and breach of
confidence--in mind, we can turn back to the facts of the Riggs case and
see how well, or how poorly, the theories applied in the case of Craig
Neidorf.
With regard to any trade-secret theory, it is worth noting first of
all that the alleged victim, BellSouth, is a Regional Bell Operating
Company--a monopoly telephone-service provider for a geographic region in
the United States. Recall the observation in Kewanee Oil, supra, that a
trade secret "gives one an opportunity to obtain an advantage over
competitors who do not know or use it." There are strong arguments
that--at least so far as the provision of Emergency 911 service
goes--BellSouth has no "competitors" within any normal meaning of the term.
And even if BellSouth did have competitors, it is likely that they would
both know and use the E911 information, since the specifications of this
particular phone service are standardized among the regional Bells.
Moreover, as became clear in the course of the Neidorf trial, the
information contained in the E911 document was available to the general
public as well, for a nominal fee. (One of the dramatic developments at
trial occurred during the cross-examination of a BellSouth witness who had
testified that the E911 document was worth nearly $80,000. Neidorf's
counsel showed her a publication containing substantially the same
information that was available from a regional Bell or from Bellcore, the
Bells' research arm, for $13 to any member of the public that ordered it
over an 800 number.) Under the circumstances, if the Bells wanted to
maintain the E911 information as a trade secret, they hadn't taken the
kind of steps one might normally think a keeper of a secret would take.
BellSouth had, however, taken the step of labelling the E911 document
as "NOT TO BE DISCLOSED OUTSIDE OF BELLSOUTH OR ITS SUBSIDIARIES" (it was
this kind of labelling that Neidorf attempted to remove as he edited the
document for publication in Phrack). This fact may have been responsible
for the federal prosecutors' oversight in not determining prior to trial
whether the E911 document met the tests of trade-secret law. It is possible
that prosecutors, unfamiliar with the nuances of trade-secret law, read
the "proprietary" warnings and, reasoning backwards, concluded that the
information thus labelled must be trade-secret information. If so, this
was a fatal error on the government's part. In the face of strong
evidence that the E911 document was neither secret nor competitively or
financially very valuable, any hope the government had of proving the
document to be a trade secret evaporated. (Alternatively, the government
may have reasoned that the E911 information could be used by malicious
hackers to damage the telephone system in some way. The trial transcript
shows instances in which the government attempted to elicit information of
this sort. It should be noted, however, that even if the information did
lend itself to abuse and vandalism, this fact alone does not bring it
within the scope of trade-secret law.)
Nor did the facts lend themselves to a Carpenter-like theory based on
breach of confidence; Neidorf had no duties to BellSouth not to disclose
its information. Neither did Riggs, from whom Neidorf acquired a copy of
the document. The Riggs case lacks the linchpin necessary for a
conviction based on Carpenter--in order for nonfiduciaries to be convicted,
there must be a breaching fiduciary involved in the scheme in some way.
There can be no breach of a duty of confidence when there is no duty to be
breached.
Thus, when its trade-secret theory of the E911 document was
demolished in mid-trial, the government had no fall-back theory to rely on
with regard to its property-crime counts, and the prosecution quickly
sought a settlement on terms favorable to Neidorf, dropping prosecution of
the case in return for Neidorf's agreement to a pre-trial diversion on one
minor count.
The lesson to be learned from Riggs is that it is no easy task to
establish the elements of a theft crime when the property in question is
information. There are good reasons, in a free society, that this should
be so--the proper functioning of free speech and a free press require that
information be presumptively protected from regulation by government or by
private entities invoking the civil or criminal law property protections.
The government in Riggs failed in its duty to recognize this presumption
by failing to make the necessary effort to understand the intellectual
property issues of the case. Had it done so, Neidorf might have been
spared an expensive and painful trial, and the government might have been
spared a black eye.*
------
*See, e.g., "Score One for the Hackers of America," NEWSWEEK, Aug. 6,
1990, page 48, and "Dial 1-800 ... for BellSouth 'Secrets',"
COMPUTERWORLD, Aug. 6, 1990, page 8.
_______________________________________________
Mike Godwin, a 1990 graduate of the University of Texas School of
Law, is staff counsel for the Electronic Frontier Foundation. EFF filed an
amicus curiae brief in the Neidorf case, arguing that Neidorf's attempted
publication of the E911 document was protected speech under the First
Amendment. Godwin received a B.A. in liberal arts from the University of
Texas at Austin in 1980. Prior to law school, Godwin worked as a
journalist and as a computer consultant.
- --
Date: Wed, 2 Jun 1993 17:08:40 EST
Sender: Computer Professionals for Social Responsibility
<uunet!VTVM2.CC.VT.EDU!CPSR%GWUVM.BITNET>
From: David Sobel <uunet!washofc.cpsr.org!dsobel>
Organization: CPSR Civil Liberties and Computing Project
Subject: CPSR NIST Crypto Statement
CPSR NIST Crypto Statement
==============================================
Department of Commerce
National Institute of Standards and Technology
Computer System Security and Privacy Advisory Board
Review of Cryptography Policy
June 1993
Statement of CPSR Washington office
Marc Rotenberg, director
(rotenberg@washofc.cpsr.org)
with David Sobel, legal counsel,
Dave Banisar, policy analyst
Mr. Chairman, members of the Advisory Panel, thank you for the
opportunity to speak today about emerging issues on cryptography
policy.
My name is Marc Rotenberg and I am director of the CPSR
Washington office. Although CPSR does not represent any computer
firm or industry trade association, we speak for many in the
computer profession who value privacy and are concerned about the
government's Clipper proposal.
During the last several years CPSR has organized several meetings
to promote public discussion of cryptography issues. We have also
obtained important government documents through the Freedom of
Information Act. We believe that good policies will only result if the
public, the profession, and the policy makers are fully informed
about the significance of these recent proposals.
We are pleased that the Advisory Board has organized hearings.
This review of cryptography policy will help determine if the Clipper
proposal is in the best interests of the country. We believe that a
careful review of the relevant laws and policies shows that the key
escrow arrangement is at odds with the public interest, and that
therefore the Clipper proposal should not go forward.
Today I will address issues 1 through 3 identified in the NIST
announcement, specifically the policy requirements of the Computer
Security Act, the legal issues surrounding the key escrow
arrangement, and the importance of privacy for network
development.
1. CRYPTOGRAPHY POLICY
The first issue concerns the 1987 statute enacted to improve
computer security in the federal government, to clarify the
responsibilities of NIST and NSA, and to ensure that technical
standards would serve civilian and commercial needs. The Computer
Security Act, which also established this Advisory Panel, is the true
cornerstone of cryptography policy in the United States. That law
made clear that in the area of unclassified computing systems, the
Department of Commerce and not the Department of Defense, would
be responsible for the development of technical standards. It
emphasized public accountability and stressed open decision-making.
The Computer Security Act grew out of a concern that classified
standards and secret meetings would not serve the interests of the
general public. As the practical applications for cryptography have
moved from the military and intelligence arenas to the commercial
sphere, this point has become clear. There is also clearly a conflict of
interest when an agency tasked with signal interception is also given
authority to develop standards for network security.
In the spirit of the Computer Security Act, NIST set out in 1989 to
develop a public-key standard FIPS. In a memo dated May 5, 1989
and obtained by CPSR through the Freedom of Information Act, NIST
said that it planned:
to develop the necessary public-key based security
standards. We require a public-key algorithm for
calculating digital signatures and we also require a
public-key algorithm for distributing secret keys.
NIST then went on to define the requirements of the standard:
The algorithms that we use must be public, unclassified,
implementable in both hardware or software, usable by
federal Agencies and U.S. based multi-national
corporation, and must provide a level of security
sufficient for the protection of unclassified, sensitive
information and commercial propriety and/or valuable
information.
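The memo's call for "a public-key algorithm for distributing secret keys" can be illustrated with a toy Diffie-Hellman exchange, the textbook example of public-key key distribution. This is a generic sketch, not the algorithm NIST ultimately pursued; the modulus and generator below are demonstration-sized assumptions, far too small for any real use.

```python
# Toy Diffie-Hellman key agreement: two parties derive a shared secret
# key over an open channel without ever transmitting the key itself.
# Illustrative parameters only -- not secure choices.
import secrets

p = 2**127 - 1   # a Mersenne prime, used here purely as a demo modulus
g = 3            # demo generator

a_priv = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b_priv = secrets.randbelow(p - 2) + 1   # Bob's private exponent

# Only these public values cross the network.
a_pub = pow(g, a_priv, p)
b_pub = pow(g, b_priv, p)

# Each side combines its own private exponent with the other's public
# value; both computations yield g^(a_priv * b_priv) mod p.
a_shared = pow(b_pub, a_priv, p)
b_shared = pow(a_pub, b_priv, p)

assert a_shared == b_shared   # both parties now hold the same secret
```

An eavesdropper who sees only a_pub and b_pub must solve a discrete-logarithm problem to recover the shared key, which is what makes public distribution of secret keys possible.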
The Clipper proposal and the full-blown Capstone configuration,
which incorporates the key management function NIST set out to
develop in 1989, are very different from the standard originally
conceived by NIST:
% The Clipper algorithm, Skipjack, is classified,
% Public access to the reasons underlying the proposal is
restricted,
% Skipjack can be implemented only in tamper-proof
hardware,
% It is unlikely to be used by multi-national corporations,
and
% Its security remains unproven.
The Clipper proposal undermines the central purpose of the
Computer Security Act. Although intended for broad use in
commercial networks, it was not developed at the request of either
U.S. business or the general public. It does not reflect public goals.
Rather it reflects the interests of one secret agency with the
authority to conduct foreign signal intelligence and another
government agency responsible for law enforcement investigations.
It is our belief that the Clipper proposal clearly violates the intent
of the Computer Security Act of 1987.
What is the significance of this? It is conceivable that an expert
panel of cryptographers will review the Skipjack algorithm and find
that it lives up to its billing, that there is no "trap door" and no
easy way to reverse-engineer it. In fact, the White House has
proposed just such a review process.
But is this process adequate? Is this the procedure the Advisory
Board would endorse for the development of widespread technical
standards? The expert participants will probably not be permitted
to publish their assessments of the proposal in scientific journals,
further review of the standard will be restricted, and those who are
skeptical will remain in the dark about the actual design of the chip.
This may be an appropriate process for certain military systems, but
it is clearly inappropriate for a technical standard that the
government believes should be widely incorporated into the
communications infrastructure.
Good government policy requires that certain process goals be
satisfied. Decisions should be made in the open. The interests of the
participating agencies should be clear. Agencies should be
accountable for their actions and recommendations. Black boxes and
government oversight are not compatible.
There is an even greater obligation to promote open decisions
where technical and scientific issues are at stake. Innovation
depends on openness. The scientific method depends on the ability
of researchers to "kick the tires" and "test drive" the product. And,
then, even if it is a fairly good design, additional testing encourages
the development of new features, improved performance and
reduced cost. Government secrecy is incompatible with such a
development process.
Many of these principles are incorporated into the Computer
Security Act and the Freedom of Information Act. The current
government policy on the development of unclassified technical
standards, as set out in the Computer Security Act, is a very good
policy. It emphasizes public applications, stresses open review, and
ensures public accountability. It is not the policy that is flawed. It is
the Clipper proposal.
To accept the Clipper proposal would be to endorse a process that
runs contrary to the law, discourages innovation, and undermines
openness.
2. LEGAL AND CONSTITUTIONAL ISSUES
There are several legal and constitutional issues raised by the
government's key escrow proposal.
The premise of the Clipper key escrow arrangement is that the
government must have the ability to intercept electronic
communications, regardless of the economic or societal costs. The
FBI's Digital Telephony proposal and the earlier Senate bill 266 were
based on the same assumption.
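The mechanics of a key escrow arrangement of the kind described here can be sketched as a two-agent secret split. The XOR split below is an illustrative assumption, not the actual Clipper/Skipjack protocol; the function names and the 80-bit key length (Skipjack's advertised key size) are chosen for the example.

```python
# Two-agent key escrow via XOR secret splitting -- an illustrative
# sketch, not the real Clipper mechanism. Neither escrow agent alone
# learns anything about the key; both shares are needed to rebuild it.
import secrets

KEY_BYTES = 10   # 80 bits, matching Skipjack's advertised key length

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split `key` into two shares held by separate escrow agents."""
    share1 = secrets.token_bytes(len(key))               # random pad
    share2 = bytes(k ^ s for k, s in zip(key, share1))   # key XOR pad
    return share1, share2

def recover_key(share1: bytes, share2: bytes) -> bytes:
    """Recombine both escrowed shares (e.g., under a lawful order)."""
    return bytes(a ^ b for a, b in zip(share1, share2))

device_key = secrets.token_bytes(KEY_BYTES)
s1, s2 = split_key(device_key)
assert recover_key(s1, s2) == device_key
```

The policy questions raised in this statement -- emergency access, escrow-agent liability, compelled disclosure -- all concern who holds s1 and s2 and under what conditions they may be recombined.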
There are a number of arguments made in defense of this
position: that privacy rights and law enforcement needs must be
balanced, or that the government will be unable to conduct criminal
investigations without this capability.
Regardless of how one views these various claims, there is one
point about the law that should be made very clear: currently there
is no legal basis -- in statute, the Constitution or anywhere else --
that supports the premise which underlies the Clipper proposal. As
the law currently stands, surveillance is not a design goal. General
Motors would have a stronger legal basis for building cars that could
not go faster than 65 miles per hour than AT&T does in marketing a
commercial telephone that has a built-in wiretap capability. In law
there is simply nothing about the use of a telephone that is
inherently illegal or suspect.
The federal wiretap statute says only that communication service
providers must assist law enforcement in the execution of a lawful
warrant. It does not say that anyone is obligated to design systems
to facilitate future wire surveillance. That distinction is the
difference between countries that restrict wire surveillance to
narrow circumstances defined in law and those that treat all users of
the telephone network as potential criminals. U.S. law takes the first
approach. Countries such as the former East Germany took the
second approach. The use of the phone system by citizens was
considered inherently suspect and for that reason more than 10,000
people were employed by the East German government to listen in
on telephone calls.
It is precisely because the wiretap statute does not contain the
obligation to incorporate surveillance capability -- the design
premise of the Clipper proposal -- that the Federal Bureau of
Investigation introduced the Digital Telephony legislation. But that
legislation has not moved forward on Capitol Hill and the law has
remained unchanged. The Clipper proposal attempts to accomplish
through the standard-setting and procurement process what the
Congress has been unwilling to do through the legislative process.
On legal grounds, adopting the Clipper would be a mistake. There
is an important policy goal underlying the wiretap law. The Fourth
Amendment and the federal wiretap statute do not so much balance
competing interests as they erect barriers against government excess
and define the proper scope of criminal investigation. The purpose
of the federal wiretap law is to restrict the government, it is not to
coerce the public.
Therefore, if the government endorses the Clipper proposal, it will
undermine the basic philosophy of the federal wiretap law and the
fundamental values embodied in the Constitution. It will establish a
technical mechanism for signal interception based on a premise that
has no legal foundation. I am not speaking rhetorically about "Big
Brother." My point is simply that the assumption underlying the
Clipper proposal is more compatible with the practice of telephone
surveillance in the former East Germany than it is with the narrowly
limited circumstances that wire surveillance has been allowed in the
United States.
There are a number of other legal issues that have not been
adequately considered by the proponents of the key escrow
arrangement that the Advisory Board should examine. First, not all
lawful wiretaps follow a normal warrant process. It is critical that
the proponents of Clipper make very clear how emergency wiretaps
will be conducted before the proposal goes forward. Second, there
may be civil liability issues for the escrow agents if there is abuse or
compromise of the keys. Escrow agents may be liable for any harm
that results. Third, there is a Fifth Amendment dimension to the
proposed escrow key arrangement if a network user is compelled to
disclose his or her key to the government in order to access a
communications network. Each one of these issues should be
examined.
There is also one legislative change that we would like the
Advisory Board to consider. During our FOIA litigation, the NSA cited
a 1951 law to withhold certain documents that were critical to
understanding the development of the Digital Signature Standard. The
law grants the government the right to restrict the disclosure
of any classified information pertaining to cryptography. While the
government may properly withhold classified information in FOIA
cases, the practical impact of this particular provision is to provide
another means to insulate cryptographic policy from public review.
Given the importance of public review of cryptography policy, the
requirement of the Computer Security Act, and the Advisory Board's
own commitment to an open, public process, we ask the Advisory
Board to recommend to the President and to the Congress that
section 798 be repealed or substantially revised to reflect current
circumstances.
This is the one area of national cryptography policy where we
believe a change is necessary.
3. INDIVIDUAL PRIVACY
Communications privacy remains a critical test for network
development. Networks that do not provide a high degree of privacy
are clearly less useful to network users. Given the choice between a
cryptography product without a key escrow and one with a key
escrow, it would be difficult to find a user who would prefer the key
escrow requirement. If this proposal does go forward, it will not be
because network users or commercial service providers favored it.
Many governments are now facing questions about restrictions on
cryptography similar to the question now being raised in this
country. It is clear that governments may choose to favor the
interests of consumers and businesses over law enforcement. Less
than a month ago, the government of Australia overrode the
objections of law enforcement and intelligence agencies and allowed
the Australian telephone companies to go forward with new digital
mobile phone networks, GSM, using the robust A5 algorithm. Other
countries will soon face similar decisions. We hope that they will
follow a similar path.
To briefly summarize, the problem here is not the existing law on
computer security or policies on cryptography and wire surveillance.
The Computer Security Act stresses public standards, open review,
and commercial applications. The federal wiretap statute is one of
the best privacy laws in the world. With the exception of one
provision in the criminal code left over from the Cold War, our
current cryptography policy is very good. It reflects many of the
values -- individual liberty, openness, government accountability --
that are crucial for democratic societies to function.
The problem is the Clipper proposal. It is an end-run around
policies intended to restrict government surveillance and to ensure
agency accountability. It is an effort to put in place a technical
configuration that is at odds with the federal wiretap law and the
protection of individual privacy. It is for these reasons that we ask
the Advisory Board to recommend to the Secretary of Commerce, the
White House, and the Congress that the current Clipper proposal not
go forward.
I thank you for the opportunity to speak with you about these
issues. I wish to invite the members of the Advisory Committee to
the third annual CPSR Privacy and Cryptography conference that will
be held Monday, June 7 in Washington, DC at the Carnegie
Endowment for International Peace. That meeting will provide an
opportunity for further discussion about cryptography policy.
ATTACHMENTS
"TWG Issue Number: NIST - May 5, 1989," document obtained
by CPSR as a result of litigation under the Freedom of
Information Act.
"U.S. as Big Brother of Computer Age," The New York Times,
May 6, 1993, at D1.
"Keeping Fewer Secrets," Issues in Science and Technology, vol.
IX, no. 1 (Fall 1992)
"The Only Locksmith in Town," The Index on Censorship
(January 1990)
[The republication of these articles for the non-commercial purpose
of informing the government about public policy is protected by
section 107 of the Copyright Act of 1976]
- --
End of Legal Net News v1i8